l1-regularized Outlier Isolation and Regression
Authors
Abstract
This paper proposes a new regression model, l1-regularized outlier isolation and regression (LOIRE), together with a fast algorithm based on block coordinate descent for solving it. In addition, under the assumption that outliers are gross errors following a Bernoulli process, the paper presents a Bernoulli estimate model which, in theory, should be highly accurate and robust because it completely eliminates the effects of outliers. Although the Bernoulli estimate is hard to compute exactly, it can be approximated through a procedure that uses LOIRE as a key intermediate step. The resulting approximate Bernoulli estimate thus combines the accuracy of the Bernoulli estimate with the efficiency of LOIRE regression, a point that several simulations confirm. Moreover, LOIRE can be further extended to realize robust rank factorization, which is effective at recovering a low-rank component from massive corruptions. Extensive experimental results show that the proposed method outperforms state-of-the-art methods such as RPCA and GoDec in computation speed while delivering competitive accuracy.
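The page reproduces only the abstract, so the following is a minimal sketch of one natural reading of the LOIRE objective, min over (x, b) of ||b||_1 + λ||y − Ax − b||², solved by block coordinate descent: a least-squares step in the regression coefficients x and a closed-form soft-thresholding step in the outlier vector b. The function names, the exact objective, and the update order are assumptions, not the authors' reference implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def loire(A, y, lam=10.0, n_iter=100):
    """Sketch of LOIRE-style block coordinate descent (assumed objective:
    min_{x,b} ||b||_1 + lam * ||y - A x - b||_2^2, with b isolating outliers)."""
    n, p = A.shape
    b = np.zeros(n)                       # outlier (gross-error) component
    for _ in range(n_iter):
        # x-block: ordinary least squares on the outlier-corrected response
        x, *_ = np.linalg.lstsq(A, y - b, rcond=None)
        # b-block: closed-form l1 proximal step on the residual
        b = soft_threshold(y - A @ x, 1.0 / (2.0 * lam))
    return x, b

# Toy usage: a linear model with a few injected gross errors
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(200)
y[:10] += 5.0                             # gross outliers
x_hat, b_hat = loire(A, y)
print(np.round(x_hat - x_true, 3))        # should be near zero
```

The two block updates are each exact minimizers of the assumed objective in their own block, which is what makes a coordinate-descent scheme of this shape converge quickly in practice.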
Similar Resources
An Efficient Method for Large-Scale l1-Regularized Convex Loss Minimization
Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gr...
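Only this truncated abstract is shown here. As a generic point of comparison, and not the paper's custom preconditioned interior-point solver, a problem from the same l1-regularized family can be stated directly in CVXPY, which dispatches it to an off-the-shelf convex solver:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
y = rng.standard_normal(100)
lam = 0.1

# l1-regularized least squares: min ||Ax - y||_2^2 + lam * ||x||_1
x = cp.Variable(20)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - y) + lam * cp.norm(x, 1)))
problem.solve()                           # CVXPY selects a suitable solver
print(np.count_nonzero(np.abs(x.value) > 1e-6), "active features")
```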
Efficient L1 Regularized Logistic Regression
L1 regularized logistic regression is now a workhorse of machine learning: it is widely used for many classification problems, particularly ones with many features. L1 regularized logistic regression requires solving a convex optimization problem. However, standard algorithms for solving convex optimization problems do not scale well enough to handle the large datasets encountered in many pract...
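The abstract is cut off at this point. For orientation only, a standard l1-regularized logistic regression fit with scikit-learn, a generic library routine rather than the specialized large-scale algorithm this paper develops, looks like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 50))        # many features, few informative
w = np.zeros(50)
w[:5] = 2.0
y = (X @ w + 0.5 * rng.standard_normal(500) > 0).astype(int)

# penalty="l1" drives most coefficients exactly to zero (feature selection)
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)
print(np.count_nonzero(clf.coef_), "nonzero coefficients of 50")
```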
Sharp Threshold for Multivariate Multi-Response Linear Regression via Block Regularized Lasso
The support union of K p-dimensional regression vectors (collected as columns of a matrix B∗) is recovered for K linear regressions using l1/l2-regularized Lasso. Sufficient and necessary conditions on the sample complexity are characterized as a sharp threshold that guarantees successful recovery of the support union. This model had previously been studied via l1/l∞-regularized Lasso by Negahban & Wain...
Multiplicative Updates for L1-Regularized Linear and Logistic Regression
Multiplicative update rules have proven useful in many areas of machine learning. Simple to implement, guaranteed to converge, they account in part for the widespread popularity of algorithms such as nonnegative matrix factorization and Expectation-Maximization. In this paper, we show how to derive multiplicative updates for problems in L1-regularized linear and logistic regression. For L1–regu...
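The multiplicative update rules themselves are not visible in this truncated abstract, so rather than guess at them, the sketch below uses a deliberately different, well-known algorithm for the same L1-regularized least-squares objective: plain proximal gradient descent (ISTA). This is a standard baseline, not the paper's updates.

```python
import numpy as np

def ista(A, y, lam=0.5, n_iter=500):
    """ISTA for min_x ||Ax - y||_2^2 + lam * ||x||_1 (a standard
    proximal-gradient baseline, not this paper's multiplicative updates)."""
    # Step size 1/L, with L the Lipschitz constant of the smooth term's gradient
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)        # gradient of the quadratic term
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # l1 prox step
    return x
```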
Fast Active-set-type Algorithms for L1-regularized Linear Regression
In this paper, we investigate new active-set-type methods for l1-regularized linear regression that overcome some difficulties of existing active set methods. By showing a relationship between l1-regularized linear regression and the linear complementarity problem with bounds, we present a fast active-set-type method, called block principal pivoting. This method accelerates computation by allowi...
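A faithful block-principal-pivoting implementation is beyond what this teaser abstract supports, so the sketch below instead shows simple cyclic coordinate descent for the same l1-regularized linear regression; the evolving nonzero set {j : x_j ≠ 0} plays the role that the active set plays in the paper's method. Names and defaults are illustrative.

```python
import numpy as np

def lasso_cd(A, y, lam=0.5, n_sweeps=100):
    """Cyclic coordinate descent for min_x ||Ax - y||_2^2 + lam * ||x||_1.
    (A simple stand-in for block principal pivoting; the nonzero support
    of x is the analogue of that method's active set.)"""
    n, p = A.shape
    x = np.zeros(p)
    col_sq = (A ** 2).sum(axis=0)             # precomputed ||A_j||^2
    r = y.copy()                              # running residual y - A x
    for _ in range(n_sweeps):
        for j in range(p):
            r += A[:, j] * x[j]               # remove coordinate j's contribution
            rho = A[:, j] @ r
            x[j] = np.sign(rho) * max(abs(rho) - lam / 2.0, 0.0) / col_sq[j]
            r -= A[:, j] * x[j]               # restore it with the new value
    return x
```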
Journal: CoRR
Volume: abs/1406.0156
Publication year: 2014